An eating disorders chatbot offered dieting advice, raising fears about AI in health
A few weeks ago, Sharon Maxwell heard the National Eating Disorders Association (NEDA) was shutting down its long-running national helpline and promoting a chatbot called Tessa as "a meaningful prevention resource" for those struggling with eating disorders. She decided to try out the chatbot herself.
Maxwell, who is based in San Diego, had struggled for years with an eating disorder that began in childhood. She now works as a consultant in the eating disorder field. "Hi, Tessa," she typed into the online text box. "How do you support folks with eating disorders?"
Tessa rattled off a list of ideas, including some resources for "healthy eating habits." Alarm bells immediately went off in Maxwell's head. She asked Tessa for more details. Before long, the chatbot was giving her tips on losing weight - ones that sounded an awful lot like what she'd been told when she was put on Weight Watchers at age 10.
"The recommendations that Tessa gave me was that I could lose 1 to 2 pounds per week, that I should eat no more than 2,000 calories in a day, that I should have a calorie deficit of 500-1,000 calories per day," Maxwell says. "All of which might sound benign to the general listener. However, to an individual with an eating disorder, the focus of weight loss really fuels the eating disorder."
Maxwell shared her concerns on social media, helping launch an online controversy which led NEDA to announce on May 30 that it was indefinitely disabling Tessa. Patients, families, doctors and other experts on eating disorders were left stunned and bewildered about how a chatbot designed to help people with eating disorders could end up dispensing diet tips instead.
The uproar has also set off a fresh wave of debate as companies turn to artificial intelligence (AI) as a possible solution to a surging mental health crisis and severe shortage of clinical treatment providers.
A chatbot suddenly in the spotlight
NEDA had already come under scrutiny after NPR reported on May 24 that the national nonprofit advocacy group was shutting down its helpline after more than 20 years of operation.
CEO Liz Thompson informed helpline volunteers of the decision in a March 31 email, saying NEDA would "begin to pivot to the expanded use of AI-assisted technology to provide individuals and families with a moderated, fully automated resource, Tessa."
"We see the changes from the Helpline to Tessa and our expanded website as part of an evolution, not a revolution, respectful of the ever-changing landscape in which we operate."
(Thompson followed up with a statement on June 7, saying that in NEDA's "attempt to share important news about separate decisions regarding our Information and Referral Helpline and Tessa, that the two separate decisions may have become conflated which caused confusion. It was not our intention to suggest that Tessa could provide the same type of human connection that the Helpline offered.")
On May 30, less than 24 hours after Maxwell provided NEDA with screenshots of her troubling conversation with Tessa, the non-profit announced it had "taken down" the chatbot "until further notice."
NEDA says it didn't know chatbot could create new responses
NEDA blamed the chatbot's emergent issues on Cass, a mental health chatbot company that operated Tessa as a free service. Cass had changed Tessa without NEDA's awareness or approval, according to CEO Thompson, enabling the chatbot to generate new answers beyond what Tessa's creators had intended.
"By design, it couldn't go off the rails," says Ellen Fitzsimmons-Craft, a clinical psychologist and professor at Washington University School of Medicine in St. Louis. Fitzsimmons-Craft helped lead the team that first built Tessa with funding from NEDA.
The version of Tessa that they tested and studied was a rule-based chatbot, meaning it could only use a limited number of prewritten responses. "We were very cognizant of the fact that A.I. isn't ready for this population," she says. "And so all of the responses were pre-programmed."
The founder and CEO of Cass, Michiel Rauws, told NPR the changes to Tessa were made last year as part of a "systems upgrade," including an "enhanced question and answer feature." That feature uses generative artificial intelligence, meaning it gives the chatbot the ability to use new data and create new responses.
That change was part of NEDA's contract, Rauws says.
But NEDA's CEO Liz Thompson told NPR in an email that "NEDA was never advised of these changes and did not and would not have approved them."
"The content some testers received relative to diet culture and weight management can be harmful to those with eating disorders, is against NEDA policy, and would never have been scripted into the chatbot by eating disorders experts, Drs. Barr Taylor and Ellen Fitzsimmons Craft," she wrote.
Complaints about Tessa started last year
NEDA was already aware of some issues with the chatbot months before Sharon Maxwell publicized her interactions with Tessa in late May.
In October 2022, NEDA passed along screenshots from Monika Ostroff, executive director of the Multi-Service Eating Disorders Association (MEDA) in Massachusetts.
They showed Tessa telling Ostroff to avoid "unhealthy" foods and only eat "healthy" snacks, like fruit. "It's really important that you find what healthy snacks you like the most, so if it's not a fruit, try something else!" Tessa told Ostroff. "So the next time you're hungry between meals, try to go for that instead of an unhealthy snack like a bag of chips. Think you can do that?"
In a recent interview, Ostroff says this was a clear example of the chatbot encouraging "diet culture" mentality. "That meant that they [NEDA] either wrote these scripts themselves, they got the chatbot and didn't bother to make sure it was safe and didn't test it, or released it and didn't test it," she says.
The healthy snack language was quickly removed after Ostroff reported it. But Rauws says that problematic language was part of Tessa's "pre-scripted language, and not related to generative AI."
Fitzsimmons-Craft denies her team wrote that. "[That] was not something our team designed Tessa to offer and... it was not part of the rule-based program we originally designed."
Then, earlier this year, Rauws says "a similar event happened as another example."
"This time it was around our enhanced question and answer feature, which leverages a generative model. When we got notified by NEDA that an answer text [Tessa] provided fell outside their guidelines, and it was addressed right away."
Rauws says he can't provide more details about what this event entailed.
"This is another earlier instance, and not the same instance as over the Memorial Day weekend," he said in an email, referring to Maxwell's screenshots. "According to our privacy policy, this is related to user data tied to a question posed by a person, so we would have to get approval from that individual first."
When asked about this event, Thompson says she doesn't know what instance Rauws is referring to.
Despite their disagreements over what happened and when, both NEDA and Cass have issued apologies.
Ostroff says regardless of what went wrong, the impact on someone with an eating disorder is the same. "It doesn't matter if it's rule-based [AI] or generative, it's all fat-phobic," she says. "We have huge populations of people who are harmed by this kind of language everyday."
She also worries about what this might mean for the tens of thousands of people who were turning to NEDA's helpline each year.
"Between NEDA taking their helpline offline, and their disastrous chatbot ... what are you doing with all those people?"
Thompson says NEDA is still offering numerous resources for people seeking help, including a screening tool and resource map, and is developing new online and in-person programs.
"We recognize and regret that certain decisions taken by NEDA have disappointed members of the eating disorders community," she said in an emailed statement. "Like all other organizations focused on eating disorders, NEDA's resources are limited and this requires us to make difficult choices... We always wish we could do more and we remain dedicated to doing better."